Source: arstechnica.com 11/21/22
Teens will finally have a way to proactively stop the spread of intimate images.
Last year, the National Center for Missing and Exploited Children (NCMEC) released data showing that it received far more reports of child sexual abuse materials (CSAM) from Facebook than from any other web service it tracked. Where other popular social platforms like Twitter and TikTok generated tens of thousands of reports, Facebook generated 22 million.
Today, Facebook announced new efforts to limit the spread of some of that CSAM on its platforms. Partnering with NCMEC, Facebook is building a “global platform” to prevent “sextortion” by helping “stop the spread of teens’ intimate images online.”
“We’re working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent,” Antigone Davis, Facebook’s VP and global head of safety, said in a blog post on Monday.
This global platform for teens will work similarly to the platform that Meta created to help adults combat “revenge porn,” Davis said, which Facebook said last year was “the first global initiative of its kind.” It lets users generate a hash to proactively stop images from being distributed on Facebook and Instagram.
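Conceptually, that hash-based approach can be sketched as follows. This is an illustrative simplification, not Meta's implementation: the function names and blocklist are hypothetical, and a SHA-256 digest stands in for the perceptual hashes (such as PhotoDNA-style fingerprints) that production matching systems use so that near-duplicate images also match. The key property is that only the fingerprint, never the image itself, is submitted to the platform.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Simplified stand-in for a perceptual hash: the image stays on
    # the user's device, and only this short digest is submitted.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical server-side blocklist built from user-submitted hashes.
blocked_hashes = {fingerprint(b"user-submitted-image-bytes")}

def allow_upload(upload_bytes: bytes) -> bool:
    # Reject any upload whose fingerprint matches the blocklist.
    return fingerprint(upload_bytes) not in blocked_hashes
```

A cryptographic hash like SHA-256 only catches byte-identical copies; perceptual hashing trades that exactness for robustness to resizing, recompression, and minor edits, which is why real matching systems prefer it.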
According to Davis, Meta found that more than 75 percent of the child exploitative content on its platforms, which spreads at rates outpacing other social media, is posted by people “with no apparent intention of harm.” Instead, the CSAM gets shared to express outrage, disgust, or “poor humor,” Davis said.
